refactor: llguidance #288 (base: main)
Conversation
Merge Protections
Your pull request matches the following merge protections and will not be merged until they are valid.
🟢 Enforce conventional commit: Wonderful, this rule succeeded. Make sure that we follow https://www.conventionalcommits.org/en/v1.0.0/
jakelorocco left a comment
Looks like there are some mypy errors as well.
I don't know why there is a mypy error. Mypy passes locally for me.
Force-pushed from 3ebb8bd to 8dce1f2.
We've seen this happen recently, and it came down to mypy versions; that could be the cause here. I re-ran the checks on the latest branch. If those still fail while mypy still passes for you locally, check your mypy version.
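If a version mismatch is the cause, one way to keep local and CI runs aligned is to pin mypy in the project's dev dependency group. A minimal sketch, assuming mypy is declared there (the version number is hypothetical; match whatever the CI job resolves):

[dependency-groups]
dev = [
    # Hypothetical pin; align with the mypy version the CI job installs.
    "mypy==1.14.1",
]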
With an updated lockfile, mypy passes. The test job was terminated in CI, but it passes locally. If you approve, I will merge this.
Force-pushed from 938d919 to 62923fe.
@guicho271828, can you remind me if we ever got the local vllm backend running on mac (or if you remember a discussion about that)?
I tried installing this version of the mellea package on my mac and got dependency resolution errors.
Details
(mellea) ➜ mellea git:(pr/guicho271828/288) uv pip install -e '.[all]' --all-extras --group dev -r pyproject.toml
Using Python 3.12.0 environment at: /opt/homebrew/Caskroom/miniforge/base/envs/mellea
× No solution found when resolving dependencies:
╰─▶ Because only the following versions of nvidia-cudnn-frontend are available:
nvidia-cudnn-frontend<=1.13.0
nvidia-cudnn-frontend==1.14.0
nvidia-cudnn-frontend==1.14.1
nvidia-cudnn-frontend==1.15.0
nvidia-cudnn-frontend==1.16.0
nvidia-cudnn-frontend==1.17.0
and nvidia-cudnn-frontend>=1.13.0 has no wheels with a matching platform tag (e.g., `macosx_15_0_arm64`), we can conclude that
nvidia-cudnn-frontend>=1.13.0 cannot be used.
And because flashinfer-python==0.5.3 depends on nvidia-cudnn-frontend>=1.13.0 and vllm==0.13.0 depends on flashinfer-python==0.5.3, we
can conclude that vllm==0.13.0 cannot be used.
And because only vllm<=0.13.0 is available and mellea depends on vllm>=0.13.0, we can conclude that your requirements are
unsatisfiable.
It looks like the newest version of vllm I can install on a mac is 0.11.0.
The vllm engine can't seem to get enough resources to run locally on my mac anyway, so maybe we can just add a note somewhere that "mellea[vllm]" doesn't work on macs?
To run vllm locally, vllm assumes
So yes.
29588fb disables installing vllm on darwin.
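For reference, a typical way to express that is a PEP 508 environment marker on the vllm requirement. A minimal sketch, assuming the dependency lives in a vllm extra (an illustration, not necessarily the exact change in 29588fb):

[project.optional-dependencies]
vllm = [
    # Environment marker: skip installing vllm on macOS (darwin).
    "vllm>=0.13.0; sys_platform != 'darwin'",
]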